Tokenization (data security)
Tokenization, when applied to data security, is the process of substituting a sensitive data element with a non-sensitive equivalent, referred to as a token, that has no extrinsic or exploitable meaning or value. The token is a reference (i.e., an identifier) that maps back to the sensitive data through a tokenization system. The mapping from original data to a token uses methods that render tokens infeasible to reverse in the absence of the tokenization system, for example tokens created from random numbers.〔CardVault: "Tokenization 101"〕 The tokenization system must be secured and validated using security best practices〔OWASP Top Ten Project〕 applicable to sensitive data protection, secure storage, audit, authentication, and authorization. The tokenization system provides data processing applications with the authority and interfaces to request tokens, or to detokenize back to sensitive data.
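To make the mapping concrete, the following is a minimal sketch of a tokenization system, assuming tokens created from random numbers as mentioned above. All class and variable names are hypothetical, for illustration only; a production system would add durable storage, access control, and auditing.
<syntaxhighlight lang="python">
# Minimal tokenization sketch: a random, meaningless token is issued for
# each sensitive value, and only the vault can map it back.
import secrets


class TokenVault:
    """Holds the token-to-value mapping; only the tokenization system sees it."""

    def __init__(self) -> None:
        self._token_to_value: dict[str, str] = {}

    def tokenize(self, sensitive_value: str) -> str:
        token = secrets.token_urlsafe(16)  # random, so infeasible to reverse
        self._token_to_value[token] = sensitive_value
        return token

    def detokenize(self, token: str) -> str:
        return self._token_to_value[token]


vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")
assert vault.detokenize(token) == "4111-1111-1111-1111"
</syntaxhighlight>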

The security and risk-reduction benefits of tokenization require that the tokenization system be logically isolated and segmented from the data processing systems and applications that previously processed or stored the sensitive data now replaced by tokens. Only the tokenization system can tokenize data to create tokens, or detokenize to redeem the sensitive data, under strict security controls. The token generation method must be proven to have the property that there is no feasible means, through direct attack, cryptanalysis, side-channel analysis, token mapping table exposure, or brute force, to reverse tokens back to live data.
When tokens replace live data in systems, the result is minimized exposure of sensitive data to those applications, stores, people, and processes, reducing the risk of compromise, accidental exposure, and unauthorized access to sensitive data. Applications can operate using tokens instead of live data, with the exception of a small number of trusted applications explicitly permitted to detokenize when strictly necessary for an approved business purpose. Tokenization systems may be operated in-house within a secure, isolated segment of the data center, or as a service from a secure service provider.
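Building on the vault sketch above, the allow-list below illustrates, in a simplified and hypothetical form, how detokenization can be restricted to a small set of trusted applications; real deployments authenticate and authorize callers far more rigorously.
<syntaxhighlight lang="python">
# Hypothetical allow-list of applications permitted to detokenize.
TRUSTED_DETOKENIZERS = {"settlement-service"}


def detokenize_for(vault: TokenVault, caller_id: str, token: str) -> str:
    """Redeem a token only for explicitly trusted callers."""
    if caller_id not in TRUSTED_DETOKENIZERS:
        raise PermissionError(f"{caller_id!r} may not detokenize")
    return vault.detokenize(token)
</syntaxhighlight>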
Tokenization may be used to safeguard sensitive data involving, for example, bank accounts, financial statements, medical records, criminal records, driver's licenses, loan applications, stock trades, voter registrations, and other types of personally identifiable information (PII). Tokenization is often used in credit card processing. The PCI Council defines tokenization as "a process by which the primary account number (PAN) is replaced with a surrogate value called a token. De-tokenization is the reverse process of redeeming a token for its associated PAN value. The security of an individual token relies predominantly on the infeasibility of determining the original PAN knowing only the surrogate value".〔PCI DSS Tokenization Guidelines〕 The choice of tokenization as an alternative to other techniques such as encryption depends on varying regulatory requirements, their interpretation, and acceptance by the respective auditing or assessment entities, in addition to any technical, architectural, or operational constraints that tokenization imposes in practical use.
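As an illustration of the PCI Council's definition, the sketch below replaces a PAN with a same-length surrogate. Keeping the last four digits is a common industry convenience (for receipts and lookups) rather than a PCI requirement, and real systems typically also ensure the surrogate fails the Luhn check so it cannot be mistaken for a live PAN; this sketch omits that step.
<syntaxhighlight lang="python">
# Sketch of PAN tokenization: replace all but the last four digits with
# random digits, and keep the token-to-PAN mapping in the vault only.
import secrets

pan_vault: dict[str, str] = {}  # token -> PAN; held by the tokenization system


def tokenize_pan(pan: str) -> str:
    digits = pan.replace(" ", "").replace("-", "")
    head = "".join(str(secrets.randbelow(10)) for _ in range(len(digits) - 4))
    token = head + digits[-4:]  # same length and last four as the original
    pan_vault[token] = digits
    return token


def detokenize_pan(token: str) -> str:
    # "Redeeming a token for its associated PAN value" (PCI definition).
    return pan_vault[token]
</syntaxhighlight>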
== Concepts and origins ==

The concept of tokenization, as adopted by the industry today, has existed since the first currency systems emerged centuries ago as a means to reduce the risk of handling high-value financial instruments by replacing them with surrogate equivalents. In the physical world, coin tokens have a long history of use as replacements for minted coins and banknotes. In more recent history, subway tokens and casino chips were adopted in their respective ecosystems to replace physical currency and mitigate cash-handling risks such as theft. Exonumia and scrip are terms for such tokens.
In the digital world, similar substitution techniques have been used since the 1970s as a means to isolate real data elements from exposure to other data ecosystems. In databases, for example, surrogate key values have been used since 1976 to decouple data tied to the internal mechanisms of a database from its external equivalents, for a variety of uses in data processing. More recently, these concepts have been extended to apply this isolation tactic as a security mechanism for data protection.
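For comparison, the schema below sketches the surrogate-key idea referred to above: an internally generated identifier stands in for a natural key, so other tables never reference the real-world value directly. Table and column names are hypothetical, and the national ID shown is a well-known dummy value.
<syntaxhighlight lang="python">
# Surrogate keys in a database: internal references use customer_id, not
# the externally meaningful identifier.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,  -- surrogate key, no external meaning
        national_id TEXT UNIQUE           -- natural key with real-world meaning
    )
""")
conn.execute("""
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customer(customer_id)  -- surrogate only
    )
""")
conn.execute("INSERT INTO customer (national_id) VALUES ('078-05-1120')")
</syntaxhighlight>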
In the payment card industry, tokenization is one means of protecting sensitive cardholder data in order to comply with industry standards and government regulations.〔"Tokenization eases merchant PCI compliance"〕 Tokenization was applied to payment card data by Shift4 Corporation〔"Shift4 Corporation Releases Tokenization in Depth White Paper"〕 and released to the public during an industry Security Summit in Las Vegas, Nevada, in 2005.〔Shift4 Launches Security Tool That Lets Merchants Re-Use Credit Card Data. Internet Retailer〕 The technology is meant to prevent the theft of credit card information in storage. Shift4 defines tokenization as: "The concept of using a non-decryptable piece of data to represent, by reference, sensitive or secret data. In the payment card industry (PCI) context, tokens are used to reference cardholder data that is managed in a tokenization system, application or off-site secure facility."〔"Shift4 Corporation Releases Tokenization in Depth White Paper"〕
To protect data over its full lifecycle, tokenization is often combined with end-to-end encryption to secure data in transit to the tokenization system or service, with a token replacing the original data on return. For example, to avoid the risk of malware stealing data from low-trust systems such as point-of-sale (POS) systems, as in the Target breach of 2013, cardholder data must be encrypted before the card data enters the POS, not after. Encryption takes place within the confines of a security-hardened and validated card-reading device, and data remains encrypted until received by the processing host, an approach pioneered by Heartland Payment Systems〔Lessons Learned from a Data Breach〕 as a means to secure payment data from advanced threats, now widely adopted by industry payment processing companies and technology companies.〔Voltage, Ingencio Partner on Data Encryption Platform〕 The PCI Council has also specified end-to-end encryption (certified point-to-point encryption, P2PE) for various service implementations in its PCI Council Point-to-Point Encryption documents.
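The flow described above can be sketched as follows, with the symmetric Fernet scheme from the `cryptography` package standing in for the card reader's encryption (deployed P2PE systems use hardware-backed key schemes such as DUKPT); all function names are hypothetical.
<syntaxhighlight lang="python">
# Encrypt-before-the-POS sketch: the reader encrypts, the low-trust POS
# forwards only ciphertext, and the host decrypts, tokenizes, and returns
# a token in place of the PAN.
import secrets
from cryptography.fernet import Fernet  # pip install cryptography

host_key = Fernet.generate_key()  # in practice provisioned in secure hardware
token_vault: dict[str, str] = {}  # token -> PAN, held by the host only


def card_reader_swipe(pan: str) -> bytes:
    # Encryption happens inside the hardened reader, before the POS sees data.
    return Fernet(host_key).encrypt(pan.encode())


def processing_host(ciphertext: bytes) -> str:
    pan = Fernet(host_key).decrypt(ciphertext).decode()
    token = secrets.token_urlsafe(16)  # token replaces the PAN on return
    token_vault[token] = pan
    return token


def pos_forward(ciphertext: bytes) -> str:
    # The low-trust POS handles only ciphertext and, later, the token.
    return processing_host(ciphertext)


token = pos_forward(card_reader_swipe("4111111111111111"))
</syntaxhighlight>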

Source: Wikipedia, the free encyclopedia.